Knowledge representation learning method incorporating entity description information and neighbor node features
Shoulong JIAO, Youxiang DUAN, Qifeng SUN, Zihao ZHUANG, Chenhao SUN
Journal of Computer Applications    2022, 42 (4): 1050-1056.   DOI: 10.11772/j.issn.1001-9081.2021071227

Knowledge graph representation learning aims to map entities and relations into a low-dimensional dense vector space. Most existing models focus on learning the structural features of triples while ignoring both the semantic information of entity relationships within triples and the entity description information outside them, which limits their knowledge representation ability. To address this problem, a knowledge representation learning method named BAGAT (knowledge representation learning based on BERT model And Graph Attention network) was proposed, fusing multi-source information. First, the entity target nodes and neighbor nodes of the triples were constructed by combining knowledge graph features, and a Graph Attention Network (GAT) was used to aggregate the semantic representation of the triple structure. Then, the Bidirectional Encoder Representations from Transformers (BERT) word vector model was used to embed the entity description information. Finally, both representations were mapped into the same vector space for joint knowledge representation learning. Experimental results show that BAGAT improves substantially on other models. On the Hits@1 and Hits@10 metrics of the public dataset FB15K-237, BAGAT outperforms the translation model TransE (Translating Embeddings) by 25.9 and 22.0 percentage points respectively, and outperforms the graph neural network model KBGAT (Learning attention-based embeddings for relation prediction in knowledge graphs) by 1.8 and 3.5 percentage points respectively, indicating that the multi-source representation incorporating entity description information and the semantic information of the triple structure achieves stronger representation learning capability.
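To make the pipeline concrete, here is a minimal NumPy sketch of the general idea: a single-head graph-attention aggregation over an entity's neighbor nodes produces a structural embedding, a stand-in BERT description vector is linearly projected into the same space, and the two are combined. This is not the authors' implementation; all dimensions, the random inputs, the `gat_aggregate` helper, and the simple additive fusion are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def gat_aggregate(h_target, h_neighbors, W, a):
    """Single-head graph attention (illustrative): project the target
    and its neighbors, score each (target, neighbor) pair with the
    attention vector a, and return the attention-weighted sum of the
    projected neighbor features."""
    z_t = W @ h_target                 # (d_out,)
    z_n = h_neighbors @ W.T            # (num_neighbors, d_out)
    scores = np.array([a @ np.concatenate([z_t, z]) for z in z_n])
    scores = np.maximum(scores, 0.2 * scores)  # LeakyReLU, slope 0.2
    alpha = softmax(scores)            # attention weights sum to 1
    return alpha @ z_n                 # (d_out,)

# Illustrative dimensions and random parameters (assumptions).
d_in, d_out, n_neigh, d_bert = 8, 4, 5, 16
W = rng.normal(size=(d_out, d_in))     # shared projection for GAT
a = rng.normal(size=2 * d_out)         # attention vector
P = rng.normal(size=(d_out, d_bert))   # maps the BERT vector into the structural space

h_entity = rng.normal(size=d_in)              # target entity features
h_neighbors = rng.normal(size=(n_neigh, d_in))  # neighbor node features
bert_desc = rng.normal(size=d_bert)           # stand-in for a BERT description embedding

struct_emb = gat_aggregate(h_entity, h_neighbors, W, a)
desc_emb = P @ bert_desc
joint_emb = struct_emb + desc_emb      # both now live in the same d_out space
print(joint_emb.shape)
```

In the actual method the description vector would come from BERT and the fused embeddings would be trained jointly with a link-prediction objective; the sketch only shows how the two sources can share one vector space.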
